The ability to predict the activities of users is an important one for recommender systems and analyses of social media. User activities can be represented in terms of relationships involving three or more things (e.g. when a user tags items on a webpage or tweets about a location he or she visited). Such relationships can be represented as a tensor, and tensor factorization is becoming an increasingly important means for predicting users' possible activities. However, the prediction accuracy of factorization is poor for ambiguous and/or sparsely observed objects. Our solution, Semantic Sensitive Tensor Factorization (SSTF), incorporates the semantics expressed by an object vocabulary or taxonomy into the tensor factorization. SSTF first links objects to classes in the vocabulary (taxonomy) and resolves the ambiguities of objects that may have several meanings. Next, it lifts sparsely observed objects to their classes to create augmented tensors. Then, it factorizes the original tensor and augmented tensors simultaneously. Since it shares semantic knowledge during the factorization, it can resolve the sparsity problem. Furthermore, as a result of the natural use of semantic information in tensor factorization, SSTF can combine heterogeneous and unbalanced datasets from different Linked Open Data sources. We implemented SSTF in the Bayesian probabilistic tensor factorization framework. Experiments on publicly available large-scale datasets using vocabularies from Linked Open Data and a taxonomy from WordNet show that SSTF has up to 12% higher accuracy in comparison with state-of-the-art tensor factorization methods.
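The "lifting" step described above can be illustrated with a minimal sketch: sparsely observed objects are mapped to their vocabulary classes, and observations are aggregated at the class level to form an augmented tensor. The tensor shape, the item-to-class mapping, and all variable names below are illustrative assumptions, not the paper's actual data or implementation.

```python
import numpy as np

# Toy (user, item, tag) count tensor; entries mark observed tagging events.
# Sizes and observations here are made up for illustration only.
n_users, n_items, n_tags = 3, 4, 2
X = np.zeros((n_users, n_items, n_tags))
X[0, 0, 0] = 1.0   # user 0 tagged item 0 with tag 0
X[1, 2, 1] = 1.0   # user 1 tagged item 2 with tag 1 (item 2 is sparsely observed)

# Hypothetical mapping from items to classes in a vocabulary/taxonomy.
item_to_class = {0: "c0", 1: "c1", 2: "c0", 3: "c1"}
classes = sorted(set(item_to_class.values()))
class_index = {c: k for k, c in enumerate(classes)}

# Lift each observation to its item's class: the augmented tensor pools
# evidence over classes, so sparse items share statistics with class siblings.
X_aug = np.zeros((n_users, len(classes), n_tags))
for u, i, t in zip(*np.nonzero(X)):
    X_aug[u, class_index[item_to_class[i]], t] += X[u, i, t]

print(X_aug[:, class_index["c0"], :])  # pooled counts for class "c0"
```

In SSTF, an augmented tensor of this kind would be factorized jointly with the original tensor so that class-level evidence informs the latent factors of sparsely observed objects; the sketch only shows how the augmented tensor itself can be assembled.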